
Adding SGD, Adam and RMSProp optimizers #42

Merged
umar456 merged 2 commits into master from optimizers on Aug 10, 2017
Conversation

@pavanky
Member

@pavanky pavanky commented Aug 9, 2017

@pavanky pavanky added this to the 0.1 milestone Aug 9, 2017
@pavanky pavanky requested a review from umar456 August 9, 2017 08:41
- Moved zeroGrad to be part of Optimizer class
- Renamed perceptron.cpp to xor.cpp
- Modified xor example to run with SGD, Adam, or RMSProp optimizers
@umar456 umar456 merged commit 7320d86 into master Aug 10, 2017